Proposal for an extension of negentropy by Kullback-Leibler information. 4th report. Capacity of negentropy.

Authors

Abstract


Similar articles

Information and Negentropy: a basis for Ecoinformatics

This paper is an attempt to develop the new discipline of ecodynamics as a quest for evolutionary physics and ecoinformatics. Particular attention is devoted to goal functions, to the relation of conceptualizations surrounding matter, energy, space, and time, and to the interdisciplinary approach connecting thermodynamics and biology. The evolutionary dynamics of complex systems, ranging from open ph...


Maximum Negentropy Beamforming

In this paper, we address an adaptive beamforming application based on the capture of far-field speech data from a single speaker in a real meeting room. After the position of a speaker is estimated by a speaker tracking system, we construct a subband-domain beamformer in generalized sidelobe canceller (GSC) configuration. In contrast to conventional practice, we then optimize the active weight...
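For reference, the negentropy criterion that maximum-negentropy beamformers of this kind maximize (stated here in our notation, which the paper itself may not use) measures how far the beamformer output Y departs from Gaussianity:

J(Y) = H(Y_{\mathrm{Gauss}}) - H(Y), \qquad H(Y) = -\int p_Y(y)\,\log p_Y(y)\,dy,

where Y_{\mathrm{Gauss}} is a Gaussian variable with the same variance as Y. Since J(Y) \ge 0 with equality only for Gaussian Y, maximizing J over the active weights of the GSC pushes the output toward the non-Gaussian, speech-like component.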


Entropy, Negentropy, and Information

Evaluation criteria for different versions of the same database. The concept of information, during its development, is connected to the concept of entropy created by the 19th-century thermodynamics scholars. Information means, from this viewpoint, order, or negentropy. Entropy, on the other hand, is connected to concepts such as chaos and noise, which in turn cause disorder. In the present paper, ...
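As a point of reference for the order-versus-disorder reading above (a standard definition, not quoted from this abstract), negentropy is the entropy deficit of a system relative to its maximum-entropy reference state:

J = H_{\max} - H \;\ge\; 0,

so J vanishes at complete disorder and grows as the system becomes more ordered, which is the sense in which information is identified with negentropy.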

متن کامل

Bootstrap Estimate of Kullback-Leibler Information for Model Selection

Estimating the Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure that, like AIC, is based on the likelihood principle. To discriminate between nested models, we have to estimate it up to the order of a constant, while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...
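The quantities involved are standard; in our notation (which the paper may not follow), with f the true density, g(\cdot\mid\theta) the model, and k the number of free parameters:

I(f; g_\theta) \;=\; \int f(x)\,\log\frac{f(x)}{g(x\mid\theta)}\,dx, \qquad \mathrm{AIC} \;=\; -2\log L(\hat\theta) + 2k.

The +2k term is the asymptotic bias correction that makes the maximized log-likelihood an approximately unbiased estimate of the expected log-likelihood, i.e. of the constant-order part of the Kullback-Leibler information that matters when discriminating nested models.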


Alternative Kullback-Leibler information entropy for enantiomers.

In our series of studies on quantifying chirality, a new chirality measure is proposed in this work based on the Kullback-Leibler information entropy. The index computes the extra information that the shape function of one enantiomer carries over a normalized shape function of the racemate, while in our previous studies the shape functions of the R and S enantiomers were used considering one as...
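Assuming the index takes the usual Kullback-Leibler form applied to shape functions (a sketch in our own notation, not the paper's), with \sigma_R the shape function of the R enantiomer and \sigma_{\mathrm{rac}} that of the racemate, each normalized to unity (\sigma(\mathbf{r}) = \rho(\mathbf{r})/N):

D(\sigma_R \,\|\, \sigma_{\mathrm{rac}}) \;=\; \int \sigma_R(\mathbf{r})\,\ln\frac{\sigma_R(\mathbf{r})}{\sigma_{\mathrm{rac}}(\mathbf{r})}\,d\mathbf{r} \;\ge\; 0,

which quantifies the extra information the enantiomer's distribution carries over the racemic reference.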



Journal

Journal title: Transactions of the Japan Society of Mechanical Engineers, Series B

Year: 1987

ISSN: 0387-5016, 1884-8346

DOI: 10.1299/kikaib.53.2863